64 research outputs found

    Reconstructing the Forest of Lineage Trees of Diverse Bacterial Communities Using Bio-inspired Image Analysis

    Full text link
    Cell segmentation and tracking allow us to extract a plethora of cell attributes from bacterial time-lapse cell movies, thus promoting computational modeling and simulation of biological processes down to the single-cell level. However, to successfully analyze complex cell movies, which image multiple interacting bacterial clones as they grow and merge into overcrowded bacterial communities with thousands of cells in the field of view, segmentation results must be near perfect to warrant good tracking results. We introduce here a fully automated closed-loop bio-inspired computational strategy that exploits prior knowledge about the expected structure of a colony's lineage tree to locate and correct segmentation errors in analyzed movie frames. We show that this correction strategy is effective, resulting in improved cell tracking and consequently trustworthy deep colony lineage trees. Our image analysis approach has the unique capability to keep tracking cells even after clonal subpopulations merge in the movie. This enables the reconstruction of the complete Forest of Lineage Trees (FLT) representation of evolving multi-clonal bacterial communities. Moreover, the percentage of valid cell trajectories extracted from the image analysis almost doubles after segmentation correction. This plethora of trustworthy data extracted from a complex cell movie analysis enables single-cell analytics as a tool for addressing compelling questions for human health, such as understanding the role of single-cell stochasticity in antibiotic resistance, without losing sight of the inter-cellular interactions and microenvironment effects that may shape it.
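
    The closed-loop idea of checking tracking output against the expected lineage-tree structure can be illustrated with the minimal sketch below. The data layout (frame, mother, daughter link tuples) and the function name are assumptions made for illustration only, not the paper's implementation; the point is simply that violations of the expected binary-division structure flag likely segmentation errors.

```python
# Minimal sketch (hypothetical data layout): flag likely segmentation errors by
# checking that each tracked cell obeys the expected binary-division structure,
# i.e. between consecutive frames a cell either continues as one object or
# divides into exactly two daughters.
from collections import defaultdict

def find_suspect_links(tracking_links):
    """tracking_links: list of (frame, mother_id, daughter_id) tuples produced
    by a tracker. Returns (frame, mother_id) pairs whose number of daughters
    is neither 1 (continuation) nor 2 (division)."""
    daughters = defaultdict(list)
    for frame, mother, daughter in tracking_links:
        daughters[(frame, mother)].append(daughter)

    suspects = []
    for key, kids in daughters.items():
        if len(kids) not in (1, 2):   # 0 = lost cell, >2 = over-segmentation
            suspects.append(key)
    return suspects

# Example: cell 5 apparently splits into three objects in frame 12 -> flagged
links = [(12, 5, 7), (12, 5, 8), (12, 5, 9), (12, 6, 10)]
print(find_suspect_links(links))   # [(12, 5)]
```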

    StochSoCs: High performance biocomputing simulations for large scale Systems Biology

    Full text link
    The stochastic simulation of large-scale biochemical reaction networks is of great importance for systems biology since it enables the study of inherently stochastic biological mechanisms at the whole-cell scale. Stochastic Simulation Algorithms (SSA) allow us to simulate the dynamic behavior of complex kinetic models, but their high computational cost makes them very slow for many realistic-size problems. We present a pilot service, named WebStoch, developed in the context of our StochSoCs research project, which allows life scientists with no high-performance computing expertise to perform, over the internet, stochastic simulations of large-scale biological network models described in the SBML standard format. Biomodels submitted to the service are parsed automatically and then placed for parallel execution on distributed worker nodes. The workers are implemented using multi-core and many-core processors, or FPGA accelerators, and can handle the simulation of thousands of stochastic repetitions of complex biomodels with possibly thousands of reactions and interacting species. Using benchmark LCSE biomodels, whose workload can be scaled on demand, we demonstrate linear speedup and more than two orders of magnitude higher throughput than existing serial simulators. Comment: The 2017 International Conference on High Performance Computing & Simulation (HPCS 2017), 8 pages.
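
    Because the stochastic repetitions of an SSA run are statistically independent, the underlying parallelization pattern can be sketched very simply. The sketch below is illustrative only (a toy birth-death model with made-up rate constants, dispatched to local worker processes); it is not the WebStoch service code, which targets distributed multi-core, many-core and FPGA workers.

```python
# Illustrative sketch: independent SSA repetitions are embarrassingly parallel,
# so a service can farm them out to workers; here, local processes run a toy
# birth-death model with Gillespie's direct method.
import math, random
from multiprocessing import Pool

def ssa_birth_death(seed, k_birth=1.0, k_death=0.1, x0=0, t_end=50.0):
    """One SSA repetition of a toy birth-death model; returns the final count."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while t < t_end:
        a1, a2 = k_birth, k_death * x               # reaction propensities
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += -math.log(1.0 - rng.random()) / a0     # time to next event
        x += 1 if rng.random() * a0 < a1 else -1    # pick which reaction fires
    return x

if __name__ == "__main__":
    with Pool() as pool:                            # one repetition per task
        finals = pool.map(ssa_birth_death, range(1000))
    print(sum(finals) / len(finals))                # sample mean of X(t_end)
```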

    Many-Core CPUs Can Deliver Scalable Performance to Stochastic Simulations of Large-Scale Biochemical Reaction Networks

    Get PDF
    Stochastic simulation of large-scale biochemical reaction networks is becoming essential for Systems Biology. It enables the in-silico investigation of complex biological system dynamics under different conditions and intervention strategies, while also taking into account the inherent "biological noise" that is especially present in the low species count regime. It is, however, a great computational challenge since in practice we need to execute many repetitions of a complex simulation model to assess the average and extreme-case behavior of the dynamical system it represents. The problem's workload scales quickly with the number of repetitions required and the number of reactions in the biomodel. The worst-case scenario is when there is a need to run thousands of repetitions of a complex model with thousands of reactions. We have developed a stochastic simulation software framework for many- and multi-core CPUs. It is evaluated using Intel's experimental many-core Single-chip Cloud Computer (SCC) CPU and the latest-generation consumer-grade Core i7 multi-core Intel CPU, running Gillespie's First Reaction Method exact stochastic simulation algorithm. It is shown that emerging many-core NoC processors can provide scalable performance, achieving linear speedup as the simulation work scales in both dimensions.
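
    For reference, Gillespie's First Reaction Method named above draws a putative firing time for every reaction at each step and fires the earliest one. The sketch below implements that textbook algorithm on a generic model; the dimerisation example and rate constants are invented for illustration and are not the benchmark models used in the paper.

```python
# Sketch of Gillespie's First Reaction Method (FRM): at each step draw a
# putative firing time for every reaction and fire the earliest one. State is a
# species-count dict; each reaction has a propensity function and a stoichiometry.
import math, random

def first_reaction_method(x, propensities, updates, t_end, seed=0):
    """x: dict of species counts; propensities: list of functions a_j(x);
    updates: list of dicts {species: delta}; simulates until t_end."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        taus = []
        for j, a in enumerate(propensities):
            aj = a(x)
            if aj > 0.0:                                   # putative time ~ Exp(a_j)
                taus.append((-math.log(1.0 - rng.random()) / aj, j))
        if not taus:
            break                                          # no reaction can fire
        tau, j = min(taus)                                 # earliest ("first") reaction
        t += tau
        for species, delta in updates[j].items():
            x[species] += delta
    return x

# Toy dimerisation model: A + A -> B (rate 0.005), B -> A + A (rate 0.1)
x = {"A": 100, "B": 0}
props = [lambda s: 0.005 * s["A"] * (s["A"] - 1) / 2, lambda s: 0.1 * s["B"]]
ups = [{"A": -2, "B": +1}, {"A": +2, "B": -1}]
print(first_reaction_method(x, props, ups, t_end=10.0))
```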

    Predictive modeling of the spatiotemporal evolution of an environmental hazard and its sensor network implementation

    Get PDF
    Accurately predicting the spatiotemporal evolution of a diffusive environmental hazard is of paramount importance for its effective containment. We approximate the front line of a hazard with a set of line segments (local front models). We model the progression characteristics of these front segments by appropriately modified 2D Gaussian functions. The modified Gaussian model parameters are adjusted based on the solution of a Kullback-Leibler (KL) divergence minimization problem. The whole scheme can be realized by a wireless sensor network by dynamically forming triplets of cooperating sensor nodes along the path of the hazard. It is shown that the algorithm can effectively track the front characteristics (in terms of direction and speed) even in the presence of faulty sensor nodes.
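
    As a rough illustration of fitting a local Gaussian front model to sensor data, the sketch below uses generic weighted moment matching, which is the KL-divergence-minimizing Gaussian approximation of an empirical distribution, and reads the segment orientation off the principal covariance axis. It is not the paper's modified 2D Gaussian model or its specific KL solution; the sensor positions and weights are invented.

```python
# Illustrative sketch (generic Gaussian fitting, not the paper's modified model):
# fit a 2D Gaussian to weighted sensor detections by moment matching, then take
# the principal covariance axis as the local front-segment orientation.
import numpy as np

def fit_front_segment(points, weights):
    """points: (N,2) positions of sensors that detected the hazard;
    weights: (N,) detection intensities. Returns (centre, orientation_rad, cov)."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    pts = np.asarray(points, dtype=float)
    mu = w @ pts                                      # weighted mean (segment centre)
    d = pts - mu
    cov = (w[:, None] * d).T @ d                      # weighted covariance
    evals, evecs = np.linalg.eigh(cov)
    major = evecs[:, np.argmax(evals)]                # principal axis ~ front line
    orientation = np.arctan2(major[1], major[0])
    return mu, orientation, cov

pts = [(0.0, 0.0), (1.0, 0.2), (2.0, 0.1), (3.0, 0.3)]
w = [1.0, 2.0, 2.0, 1.0]
centre, theta, cov = fit_front_segment(pts, w)
print(centre, np.degrees(theta))
```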

    Collaborative sensor network algorithm for predicting the spatiotemporal evolution of hazardous phenomena

    Get PDF
    We present a novel decentralized Wireless Sensor Network (WSN) algorithm which can estimate both the speed and direction of an evolving diffusive hazardous phenomenon (e.g. a wildfire, an oil spill, etc.). In the proposed scheme we approximate a progressing hazard’s front as a set of line segments. The spatiotemporal evolution of each line segment is modeled by a modified 2D Gaussian function. As the phenomenon evolves, the parameters of this model are updated based on the analytical solution of a Kullback-Leibler (KL) divergence minimization problem. This leads to an efficient distributed WSN parameter estimation algorithm that can be implemented by dynamically formed clusters (triplets) of collaborating sensor nodes. Computer simulations show that our approach is able to track the evolving phenomenon with reasonable accuracy even if a fraction of the sensors fail due to the hazard and/or the phenomenon has a time-varying speed.
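
    One way the dynamic formation of collaborating triplets could look in code is sketched below. The clustering rule (a newly alerted node pairs with its two nearest already-alerted neighbours, rejecting near-collinear combinations) is an assumption for illustration, not necessarily the paper's rule; node positions and the area threshold are invented.

```python
# Illustrative sketch (assumed clustering rule): when a node detects the hazard
# it forms a triplet with its two nearest already-alerted neighbours, rejecting
# near-collinear combinations that make the local front geometry ill-conditioned.
import numpy as np

def form_triplet(new_node, alerted, positions, min_area=1e-3):
    """new_node: node id; alerted: iterable of alerted node ids;
    positions: dict id -> (x, y). Returns a triplet of ids or None."""
    p0 = np.array(positions[new_node], dtype=float)
    candidates = sorted((n for n in alerted if n != new_node),
                        key=lambda n: np.linalg.norm(np.array(positions[n]) - p0))
    for i in range(len(candidates)):
        for j in range(i + 1, len(candidates)):
            v1 = np.array(positions[candidates[i]], dtype=float) - p0
            v2 = np.array(positions[candidates[j]], dtype=float) - p0
            area = 0.5 * abs(v1[0] * v2[1] - v1[1] * v2[0])   # triangle area
            if area > min_area:                               # non-collinear triplet
                return (new_node, candidates[i], candidates[j])
    return None

positions = {1: (0, 0), 2: (1, 0), 3: (2, 0), 4: (1, 1)}
print(form_triplet(4, [1, 2, 3], positions))   # e.g. (4, 2, 1)
```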

    Scalable FPGA accelerator of the NRM algorithm for efficient stochastic simulation of large-scale biochemical reaction networks

    Get PDF
    Stochastic simulation of large-scale biochemical reaction networks, with thousands of reactions, is important for systems biology and medicine since it will enable in-silico experimentation with genome-scale reconstructed networks. FPGA-based stochastic simulation accelerators can exploit parallelism, but have been limited in the size of the biomodels they can handle. We present a high-performance, scalable System on Chip architecture for implementing Gibson and Bruck's Next Reaction Method efficiently in reconfigurable hardware. Our MPSoC uses aggressive pipelining at the core level and also combines many cores into a Network on Chip in order to execute in parallel stochastic repetitions of complex biomodels, each one with up to 4K reactions. The performance of our NRM core depends only on the average outdegree of the biomodel's Dependencies Graph (DG) and not on the number of DG nodes (reactions). By adding cores to the NoC, the system's performance scales linearly and reaches GCycles/sec levels. We show that a medium-size FPGA running at ~200 MHz delivers high speedup gains relative to a popular and efficient software simulator running on a very powerful workstation PC.
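
    The software sketch below shows why Next Reaction Method cost per step tracks the dependency graph's out-degree rather than the total number of reactions: only dependent reactions have their putative times refreshed in a priority queue. It is a simplified illustration, not the paper's hardware design; unlike Gibson-Bruck it redraws fresh times for affected reactions (valid by memorylessness) instead of rescaling them, and uses lazy deletion of stale heap entries. The toy model is invented.

```python
# Simplified NRM-style sketch: priority queue of putative firing times plus a
# dependency graph, so each step only touches the fired reaction's dependents.
import heapq, math, random

def nrm(x, propensities, updates, depends, t_end, seed=0):
    """x: dict species->count; propensities[j](x) -> a_j; updates[j]: stoichiometry;
    depends[j]: reactions whose propensity changes when j fires (including j)."""
    rng = random.Random(seed)
    draw = lambda a: -math.log(1.0 - rng.random()) / a if a > 0 else math.inf
    version = [0] * len(propensities)          # for lazy deletion of stale entries
    heap = []
    for j, a in enumerate(propensities):
        heapq.heappush(heap, (draw(a(x)), j, 0))
    t = 0.0
    while heap:
        tau, j, v = heapq.heappop(heap)
        if v != version[j]:
            continue                           # stale entry, skip
        if tau > t_end or tau == math.inf:
            break
        t = tau
        for sp, d in updates[j].items():       # apply stoichiometry of fired reaction
            x[sp] += d
        for k in depends[j]:                   # only dependent reactions are touched
            version[k] += 1
            heapq.heappush(heap, (t + draw(propensities[k](x)), k, version[k]))
    return x

# Toy model: A -> B (rate 0.2*A), B -> A (rate 0.1*B)
x = {"A": 50, "B": 0}
props = [lambda s: 0.2 * s["A"], lambda s: 0.1 * s["B"]]
ups = [{"A": -1, "B": +1}, {"A": +1, "B": -1}]
deps = [[0, 1], [0, 1]]
print(nrm(x, props, ups, deps, t_end=20.0))
```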

    Estimating the spatiotemporal evolution characteristics of diffusive hazards using wireless sensor networks

    Get PDF
    There is fast-growing interest in exploiting Wireless Sensor Networks (WSNs) for tracking the boundaries and predicting the evolution properties of diffusive hazardous phenomena (e.g. wildfires, oil slicks, etc.), often modeled as “continuous objects”. We present a novel distributed algorithm for estimating and tracking the local evolution characteristics of continuous objects. The hazard’s front line is approximated as a set of line segments, and the spatiotemporal evolution of each segment is modeled by a small number of parameters (orientation, direction and speed of motion). As the hazard approaches, these parameters are re-estimated using ad-hoc clusters (triplets) of collaborating sensor nodes. Parameter updating is based on algebraic closed-form expressions resulting from the analytical solution of a Bayesian estimation problem. It can therefore be implemented by the microprocessors of the WSN nodes, while respecting their limited processing capabilities and strict energy constraints. Extensive computer simulations demonstrate the ability of the proposed distributed algorithm to accurately estimate the evolution characteristics of complex hazard fronts under different conditions using reasonably dense WSNs. The proposed in-network processing scheme does not require sensor node clock synchronization and is shown to be robust to sensor node and communication link failures, which are expected in harsh environments.
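
    A simple closed-form illustration of what a triplet can estimate is sketched below: if the front is locally a straight line moving at constant velocity, its arrival times at three non-collinear sensors are an affine function of position, and solving the resulting 3x3 system yields the direction and speed of motion. This is standard plane-fit reasoning offered for intuition; the paper's own Bayesian closed-form expressions may differ, and the example coordinates are invented.

```python
# Illustrative closed-form sketch: a locally linear front crossing three
# non-collinear sensors gives arrival times t = s . p + c with slowness vector s;
# the front's direction of motion is s/|s| and its speed is 1/|s|.
import numpy as np

def front_speed_direction(positions, arrival_times):
    """positions: (3,2) sensor coordinates; arrival_times: (3,) detection times."""
    P = np.asarray(positions, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    A = np.column_stack([P, np.ones(3)])       # rows [x, y, 1]
    sx, sy, _ = np.linalg.solve(A, t)          # slowness vector and time offset
    slowness = np.hypot(sx, sy)
    speed = 1.0 / slowness
    direction = np.array([sx, sy]) / slowness  # unit vector of propagation
    return speed, direction

# A front moving along +x at 2 m/s reaches (0,0), (2,0), (0,1) at t = 0, 1, 0
speed, direction = front_speed_direction([(0, 0), (2, 0), (0, 1)], [0.0, 1.0, 0.0])
print(speed, direction)                        # ~2.0, [1, 0]
```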

    Visualizing Meta-Features in Proteomic Maps

    Get PDF
    Background: The steps of a high-throughput proteomics experiment include the separation, differential expression and mass spectrometry-based identification of proteins. However, the last and most challenging step is inferring the biological role of the identified proteins through their association with interaction networks, biological pathways, analysis of the effect of post-translational modifications, and other protein-related information. Results: In this paper, we present an integrative visualization methodology that allows combining experimentally produced proteomic features with protein meta-features, typically coming from meta-analysis tools and databases, in synthetic Proteomic Feature Maps. Using three proteomics analysis scenarios, we show that the proposed visualization approach is effective in filtering, navigating and interacting with the proteomics data in order to address visually challenging biological questions. The novelty of our approach lies in the ease of integration of any user-defined proteomic features in easy-to-comprehend visual representations that resemble the familiar 2D-gel images and can be adapted to the user's needs. The main capabilities of the developed VIP software, which implements the presented visualization methodology, are also highlighted and discussed. Conclusions: By using this visualization and the associated VIP software, researchers can explore a complex heterogeneous proteomics dataset from different perspectives in order to address visually important biological queries and formulate new hypotheses for further investigation. VIP is freely available at http://pelopas.uop.gr/~egian/VIP/index.html.
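
    To give a rough sense of the map concept, the sketch below places synthetic proteins on gel-like coordinates (isoelectric point vs. molecular weight) and overlays a meta-feature as marker colour with differential expression as marker size. It is a generic matplotlib illustration using randomly generated data and a made-up meta-feature, not the VIP software.

```python
# Minimal sketch of a proteomic feature map: gel-like coordinates with an
# overlaid meta-feature (colour) and differential expression (marker size).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 60
pI = rng.uniform(3, 11, n)                     # isoelectric point
mw = rng.uniform(10, 150, n)                   # molecular weight (kDa)
fold_change = rng.lognormal(0.0, 0.7, n)       # differential expression
partners = rng.integers(0, 30, n)              # hypothetical meta-feature

plt.figure(figsize=(6, 5))
sc = plt.scatter(pI, mw, s=40 * fold_change, c=partners, cmap="viridis", alpha=0.8)
plt.xlabel("isoelectric point (pI)")
plt.ylabel("molecular weight (kDa)")
plt.colorbar(sc, label="interaction partners (meta-feature)")
plt.title("Synthetic proteomic feature map")
plt.tight_layout()
plt.show()
```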

    Simulation-driven emulation of collaborative algorithms to assess their requirements for a large-scale WSN implementation

    Get PDF
    Assessing how the performance of a decentralized wireless sensor network (WSN) algorithm's implementation scales, in terms of communication and energy costs, as the network size increases is an essential requirement before its field deployment. Simulations are commonly used for this purpose, especially for large-scale environmental monitoring applications. However, it is difficult to evaluate energy consumption, processing and memory requirements before the algorithm is actually ported to a real WSN platform. We propose a method for emulating the operation of collaborative algorithms in large-scale WSNs by re-using a small number of available real sensor nodes. We demonstrate the potential of the proposed simulation-driven WSN emulation approach by using it to estimate how communication and energy costs scale with the network’s size when implementing a collaborative algorithm we developed for tracking the spatiotemporal evolution of a progressing environmental hazard.
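
    The bookkeeping behind such an emulation could look roughly like the sketch below: simulated message traffic from many virtual nodes is replayed on a small pool of physical nodes while per-virtual-node transmit counts are accumulated, so communication and energy costs can be extrapolated to the full network size. The mapping rule, message format and per-byte energy figure are assumptions for illustration, not the authors' framework.

```python
# Hedged sketch of the emulation idea: map N virtual nodes onto a few physical
# nodes (round robin), replay the simulator's messages, and accumulate
# per-virtual-node traffic to estimate communication and energy costs.
from collections import defaultdict

ENERGY_PER_BYTE_UJ = 2.0   # assumed radio cost, micro-joules per transmitted byte

def emulate(sim_messages, n_physical):
    """sim_messages: iterable of (virtual_sender_id, n_bytes) from the simulator.
    Returns (virtual->physical mapping, tx_bytes per virtual node, energy in uJ)."""
    tx_bytes = defaultdict(int)
    mapping = {}
    for sender, n_bytes in sim_messages:
        phys = mapping.setdefault(sender, len(mapping) % n_physical)
        tx_bytes[sender] += n_bytes          # message replayed on physical node `phys`
    energy = {v: b * ENERGY_PER_BYTE_UJ for v, b in tx_bytes.items()}
    return mapping, dict(tx_bytes), energy

msgs = [(0, 24), (1, 24), (2, 48), (0, 24), (7, 12)]
mapping, traffic, energy = emulate(msgs, n_physical=3)
print(mapping)             # virtual -> physical node assignment
print(traffic, energy)     # extrapolated communication and energy costs
```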